2,728 research outputs found

    Key challenges in agent-based modelling for geo-spatial simulation

    Agent-based modelling (ABM) is fast becoming the dominant paradigm in social simulation, due primarily to a worldview that suggests that complex systems emerge from the bottom up, are highly decentralised, and are composed of a multitude of heterogeneous objects called agents. These agents act with some purpose, and their interaction, usually through time and space, generates emergent order, often at higher levels than those at which the agents operate. ABM, however, raises as many challenges as it seeks to resolve. The purpose of this paper is to catalogue these challenges and to illustrate them using three somewhat different agent-based models applied to city systems. The seven challenges we pose involve: the purpose for which the model is built, the extent to which the model is rooted in independent theory, the extent to which the model can be replicated, the ways the model might be verified, calibrated and validated, the way model dynamics are represented in terms of agent interactions, the extent to which the model is operational, and the way the model can be communicated and shared with others. Once catalogued, we illustrate these challenges with a pedestrian model for emergency evacuation in central London, a hypothetical model of residential segregation tuned to London data which elaborates the standard Schelling (1971) model, and an agent-based residential location model built according to spatial interaction principles and calibrated to trip data for Greater London. The ambiguities posed by this new style of modelling are drawn out as conclusions.
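
    The segregation example elaborates the standard Schelling (1971) dynamic. For orientation, below is a minimal sketch of that standard model; the grid size, two-group split, vacancy share and tolerance threshold are illustrative assumptions, not the parameters of the London-tuned model described in the abstract.

```python
import random

def neighbours(grid, x, y, size):
    """Return the occupied Moore neighbours of cell (x, y) on a toroidal grid."""
    cells = []
    for dx in (-1, 0, 1):
        for dy in (-1, 0, 1):
            if dx == 0 and dy == 0:
                continue
            occupant = grid[(x + dx) % size][(y + dy) % size]
            if occupant is not None:
                cells.append(occupant)
    return cells

def step(grid, size, tolerance):
    """One sweep: every unhappy agent relocates to a randomly chosen empty cell."""
    empties = [(x, y) for x in range(size) for y in range(size) if grid[x][y] is None]
    for x in range(size):
        for y in range(size):
            agent = grid[x][y]
            if agent is None or not empties:
                continue
            nbrs = neighbours(grid, x, y, size)
            like = sum(1 for n in nbrs if n == agent)
            # Unhappy if fewer than `tolerance` of the neighbours share the agent's group.
            if nbrs and like / len(nbrs) < tolerance:
                nx, ny = empties.pop(random.randrange(len(empties)))
                grid[nx][ny], grid[x][y] = agent, None
                empties.append((x, y))

# Illustrative setup: 30x30 grid, two equal groups, 10% vacancies, 30% tolerance.
SIZE, TOL = 30, 0.3
cells = [0] * 405 + [1] * 405 + [None] * 90
random.shuffle(cells)
grid = [cells[i * SIZE:(i + 1) * SIZE] for i in range(SIZE)]
for _ in range(50):
    step(grid, SIZE, TOL)
```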

    Thermodynamics and kinetics of heterogeneous reactions

    Thermodynamics and kinetics of sublimation, catalytic, and oxidation reactions.

    Unpacking critical theories to enhance creative practice: a PhD in screenwriting case study

    Drawing from my own experiences of the practice-based research degree, this article outlines some of the key principles I consider to be necessary for negotiating a PhD in the specific area of screenwriting, for both the candidate and the supervisor. Referencing my own and others' ideas of the practice-based PhD, the article places the screenwriter at the centre of its investigation, celebrating their role in the interplay between the creative and the critical; between practice and theory; between doing and thinking. It argues that, just like the protagonist of a screenplay, the screenwriting PhD should take its candidate on a journey: one that improves not only craft skills, but also an understanding of what it means to write for the screen.

    A vacuous screenplay in search of rigour

    A university professor with a reputation for creative practice research finds himself at a crossroads when, en route to an international conference, he meets a younger and somewhat modest dementia researcher whose work is clearly having an impact on people’s lives. A keynote speaker at a creative writing conference in Hawaii, the professor is impelled to reflect on his own research practice and piece together fragments of his work history to reassure himself that what he does is not only valid as research, but also that it has rigour. With flashbacks to a variety of painful and often comic encounters with colleagues trying to articulate their practice as research, he is able to overcome his mid-flight, mid-career crisis and come to a renewed and satisfactory understanding of what good creative practice research is, and how that can be articulated clearly and confidently to others. Originally performed at the University of Southern Queensland’s inaugural ‘Scriptwriting as Research’ symposium in 2016, A Vacuous Screenplay in Search of Rigour thus interrogates not only the very mode of creative practice research, but also the broader (and varied) institutional research cultures within which it operates.

    The impact and penetration of location-based services

    Since the invention of digital technology, its development has followed an entrenched path of miniaturisation and decentralisation, with increasing focus on individual and niche applications. Computer hardware has moved from remote centres to desktop and hand-held devices whilst being embedded in various material infrastructures. Software has followed the same course. The entire process has converged on a path where various analogue devices have become digital and are increasingly being embedded in machines at the smallest scale. In a parallel but essential development, there has been a convergence of computers with communications, ensuring that the delivery and interaction mechanisms for computer software are now focused on networks of individuals, not simply through the desktop but in mobile contexts. Various inert media such as fixed television are becoming more flexible as computers and visual media become one.

    With such massive convergence and miniaturisation, new software and new applications define the cutting edge. As computers are increasingly tailored to individual niches, new digital services are emerging, many of which represent applications which hitherto did not exist or at best were rarely focused on a mass market. Location-based services form one such application, and in this paper we will both speculate on and make some initial predictions of the geographical extent to which such services will penetrate different markets. We define such services in detail below, but suffice it to say at this stage that such functions involve the delivery of traditional services using digital media and telecommunications. High-profile applications are now being focused on hand-held devices, typically involving information on product location and entertainment, but wider applications involve fixed installations on the desktop where services are delivered through traditional fixed infrastructure. Both wired and wireless applications define this domain. The market for such services is inevitably volatile and unpredictable at this early stage, but we will attempt here to provide some rudimentary estimates of what might happen in the next five to ten years.

    The 'network society' which has developed through this convergence is, according to Castells (1989, 2000), changing and re-structuring the material basis of society such that information has come to dominate wealth creation, being both a raw material of production and an outcome of production as a tradable commodity. This has been fuelled by the way technology has expanded following Moore's Law and by fundamental changes in the way telecommunications, finance, insurance, utilities and so on are being regulated. Location-based services are becoming an integral part of this fabric, and they reflect yet another convergence between geographic information systems, global positioning systems, and satellite remote sensing. The first geographical information system, CGIS, was developed as part of the Canada Land Inventory in 1965, and the acronym 'GIS' was introduced in 1970. 1971 saw the first commercial satellite, LANDSAT-1. The 1970s also saw prototypes of ISDN and the mobile telephone, and the introduction of TCP/IP as the dominant network protocol. The 1980s saw the IBM XT (1982) and the beginning of de-regulation in the US, Europe and Japan of key sectors within the economy. Finally, in the 1990s, we saw the introduction of the World Wide Web and the ubiquitous pervasion of business and recreation by networked PCs, the Internet, mobile communications, and the growing use of GPS for locational positioning and GIS for the organisation and visualisation of spatial data. By the end of the 20th century, the number of mobile telephone users had reached 700 million worldwide. The increasing mobility of individuals, the anticipated availability of broadband communications for mobile devices, and the growing volumes of location-specific information available in databases will inevitably lead to demand for services that deliver location-related information to individuals on the move. Such location-based services (LBS), although at a very early stage of development, are likely to play an increasingly important part in the development of social structures and business in the coming decades.

    In this paper we begin by defining location-based services within the context we have just sketched. We then develop a simple model of the market for location-based services, building on the standard non-linear saturation model of market penetration. We illustrate this for mobile devices, namely mobile phones, in the following sections, and then develop an analysis of different geographical regimes characterised by different growth rates and income levels worldwide. This leads us to speculate on the extent to which location-based services are beginning to take off and penetrate the market. We conclude with scenarios for future growth through the analogy of GIS and mobile penetration.
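
    The "standard non-linear saturation model of market penetration" referred to above is typically a logistic (S-shaped) growth curve. The sketch below shows that form under illustrative assumptions only; the saturation level, growth rate and midpoint year are placeholders, not the estimates made in the paper.

```python
import math

def logistic_penetration(t, saturation, rate, midpoint):
    """Logistic saturation curve: penetration approaches `saturation` as t grows.

    saturation -- the market ceiling (e.g. share of population subscribing)
    rate       -- steepness of take-up
    midpoint   -- year at which half of the saturation level is reached
    """
    return saturation / (1.0 + math.exp(-rate * (t - midpoint)))

# Illustrative only: a market saturating at 80% penetration with a midpoint in 2005.
for year in range(1995, 2016, 5):
    p = logistic_penetration(year, saturation=0.80, rate=0.5, midpoint=2005)
    print(year, round(p, 3))
```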

    A New Parameter Set for the Relativistic Mean Field Theory

    Subtracting the Strutinsky shell corrections from the self-consistent energies obtained within the Relativistic Mean Field Theory (RMFT), we have obtained estimates of the macroscopic part of the binding energies of 142 spherical even-even nuclei. By minimizing their root-mean-square deviations from the values obtained with the Lublin-Strasbourg Drop (LSD) model with respect to the nine RMFT parameters, we have found the optimal set (NL4). The new parameters also reproduce the radii of these nuclei with an accuracy comparable to that obtained with the NL1 and NL3 sets.

    Comment: Seminar given at the 10th Nuclear Physics Workshop in Kazimierz, Poland, Sep. 24-28, 2003.
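
    The fit described here is a nine-parameter root-mean-square minimization. The schematic below shows only the shape of such a procedure; `rmft_energy`, `shell_correction` and `lsd_energy` are crude placeholders standing in for the actual RMFT, Strutinsky and LSD calculations, which are far more involved than a few lines of code.

```python
import numpy as np
from scipy.optimize import minimize

# Placeholder stand-ins (assumptions, not real physics):
def rmft_energy(params, Z, N):
    # Toy linear model in nine features, just to make the sketch runnable.
    return params @ np.array([Z, N, Z * N, Z**2, N**2, Z + N, 1.0, Z - N, (Z - N) ** 2])

def shell_correction(params, Z, N):
    return 0.0  # placeholder for the Strutinsky shell correction

def lsd_energy(Z, N):
    return -8.0 * (Z + N)  # placeholder: roughly 8 MeV binding per nucleon

def rms_deviation(params, nuclei):
    """Root-mean-square deviation of the RMFT macroscopic energies from the LSD values."""
    diffs = [
        (rmft_energy(params, Z, N) - shell_correction(params, Z, N)) - lsd_energy(Z, N)
        for Z, N in nuclei
    ]
    return np.sqrt(np.mean(np.square(diffs)))

# `nuclei` stands for the 142 spherical even-even (Z, N) pairs used in the paper.
nuclei = [(20, 20), (28, 28), (50, 70), (82, 126)]  # illustrative subset only
result = minimize(rms_deviation, x0=np.zeros(9), args=(nuclei,), method="Nelder-Mead")
print(result.x)
```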

    The metabolic syndrome adds utility to the prediction of mortality over its components: The Vietnam Experience Study

    Background
    The metabolic syndrome increases mortality risk. However, as "non-affected" individuals may still have up to two risk factors, the utility of using three or more components to identify the syndrome, and its predictive advantage over the individual components, have yet to be determined.

    Methods
    Participants, male Vietnam-era veterans (n = 4265) from the USA, were followed up from 1985/1986 for 14.7 years (61,498 person-years), and all-cause and cardiovascular disease deaths were collated. Cox's proportional-hazards regression was used to assess the effect of the metabolic syndrome and its components on mortality, adjusting for a wide range of potential confounders.

    Results
    At baseline, 752 participants (17.9%) were identified as having the metabolic syndrome. There were 231 (5.5%) deaths from all causes, 60 of them from cardiovascular disease. After adjustment for a range of covariates, the metabolic syndrome increased the risk of all-cause mortality (HR 2.03, 95% CI 1.52, 2.71) and cardiovascular disease mortality (HR 1.92, 95% CI 1.10, 3.36). Risk increased dose-dependently with increasing numbers of components. The increased risk from possessing only one or two components was not statistically significant. The adjusted risk for four or more components was greater than for only three components for both all-cause mortality (HR 2.30, 95% CI 1.45, 3.66 vs. HR 1.70, 95% CI 1.11, 2.61) and cardiovascular disease mortality (HR 3.34, 95% CI 1.19, 9.37 vs. HR 2.81, 95% CI 1.07, 7.35). The syndrome was more informative than the individual components for all-cause mortality, but this could not be assessed for cardiovascular disease mortality due to multicollinearity. Hyperglycaemia was the individual parameter most strongly associated with mortality.
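
    The survival analysis described in Methods is a Cox proportional-hazards regression. The sketch below shows the general shape of such a fit using the Python lifelines package under illustrative assumptions; the column names and toy data are placeholders, not the study's variables or results.

```python
import pandas as pd
from lifelines import CoxPHFitter

# Illustrative data only: follow-up time in years, death indicator (1 = died),
# a metabolic-syndrome flag and one example covariate (age).
df = pd.DataFrame({
    "followup_years": [14.7, 9.2, 14.7, 4.1, 14.7, 11.3],
    "died":           [0,    1,   0,    1,   0,    1],
    "met_syndrome":   [0,    1,   0,    1,   1,    0],
    "age":            [38,   45,  41,   52,  47,   39],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="followup_years", event_col="died")
cph.print_summary()  # hazard ratios (exp(coef)) with 95% confidence intervals
```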